Reading Camera Data
Elias Groot
Software Lead, Course Organizer
The official ASE controller service forms the bridge between pre-processed camera data (read from the imaging service) and actuator control data (read by the actuator service). We will replicate this in our dummy controller project, so the first step is getting the camera data in.
Creating a Read Stream
To retrieve the pre-processed camera data, we need to create a read stream that reads from the imaging service. To initialize a read stream using the roverlib, you just need to specify the exact name of the service (case sensitive) and the stream you want to read from this service.
How do we know which stream we want from the imaging service? Easy, we just look at the outputs field in its service.yaml definition:
...
outputs:
- path
...
In this case, there is only one stream that we can read from: path. We initialize our read stream as follows:
// "I am the dummy controller, and I want to read from the 'path' stream from the 'imaging' service"
read_stream *imaging = get_read_stream(&service, "imaging", "path");
if (imaging == NULL) {
printf("Could not get read stream for 'path' from 'imaging'\n");
return 1;
}
From a read stream, we can... read. Let's try it out.
Reading From a Read Stream
With our new read stream captured in the imaging variable, we can start reading data from the imaging service. For reading, there are two methods available:
- read_pb(), which will read bytes from imaging and decode the binary message into a SensorOutput struct from the rovercom package, as defined here
- read_bytes(), which will read bytes from imaging and return these to the caller raw (see the sketch below)
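If you want to handle the decoding yourself (or forward the bytes untouched), read_bytes() is the raw alternative. The snippet below is only a minimal sketch, assuming read_bytes() returns a buffer and writes its length through an out-parameter; check the roverlib header for the exact signature in your version.
// Sketch only: the exact read_bytes() signature and buffer ownership
// may differ, consult the roverlib header before using this
size_t len = 0;
uint8_t *buf = read_bytes(imaging, &len);
if (buf == NULL) {
    printf("Could not read raw bytes from 'path'\n");
} else {
    printf("Received %zu raw bytes from the imaging service\n", len);
    // ... decode or forward the bytes yourself ...
}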
Since we know that the imaging service will send a CameraSensorOutput message, the easiest method for reading is to just use read_pb().
Take a moment to look at the protobuf definitions for the CameraSensorOutput message and the general SensorOutput wrapper message. You should understand the message contents so that you can use them in your service code.
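As a quick reference, the fields we will use in this chapter map onto the generated protobuf-c structs roughly as follows. This is an abridged sketch based on the accessors used below, not the full generated header:
// Abridged field reference (based on the accessors used in this chapter;
// see the rovercom definitions for the full message contents):
//
// ProtobufMsgs__SensorOutput
//   ->timestamp           : when the message was created
//   ->sensor_output_case  : which oneof member is set
//   ->cameraoutput        : the CameraSensorOutput payload (if set)
//
// ProtobufMsgs__CameraSensorOutput
//   ->trajectory->width, ->height   : camera frame dimensions
//   ->trajectory->n_points          : number of detected edge points
//   ->trajectory->points[i]->x, ->y : the edge coordinates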
Reading is then done as follows:
// Read one message from the stream
ProtobufMsgs__SensorOutput *msg = read_pb(imaging);
if (msg == NULL) {
    printf("Could not read 'path' from 'imaging'\n");
}
Easy right? The task now is to unpack our message. Recall that msg holds the contents of a SensorOutput wrapper message since we cannot easily discriminate unions with Protobuf. It contains some useful metadata, like the timestamp at which the imaging service created the captured message:
// When did the imaging service create this message?
long long createdAt = msg->timestamp;
printf("Received message with timestamp: %lld\n", createdAt);
But what we are really interested in is the camera data. Accessing it can be done using the case enum that the protoc compiler generated for us, after which we can access the union type of the generated struct.
// Get the camera data
if (msg->sensor_output_case != PROTOBUF_MSGS__SENSOR_OUTPUT__SENSOR_OUTPUT_CAMERA_OUTPUT) {
    printf("Received sensor data, but not camera data\n");
} else {
    ProtobufMsgs__CameraSensorOutput *camera_output = msg->cameraoutput;
    printf("Imaging service captured a %d by %d image\n", camera_output->trajectory->width, camera_output->trajectory->height);
}
From the CameraSensorOutput message definition, we know that the imaging service sends a Trajectory with an array of Point objects. These Points are just X/Y-coordinate pairs that represent the detected edges of the track (within the Trajectory.Width and Trajectory.Height camera frame).
To decide how to steer, let's introduce a new variable: steerPosition, which is a float and represents the angle to which the Rover's servo should move. -1.0 represents steering all the way left, 0.0 represents centering, and 1.0 represents steering all the way right.
Let's take the first two Points that the imaging service captured and find the midpoint between them. Ideally, this midpoint should lie exactly halfway between 0 and Trajectory.Width; if it is off, the Rover must be off-center. Let's use this heuristic to steer.
// This value holds the steering position that we want to pass to the servo (-1 = left, 0 = center, 1 = right)
float steerPosition = 0;

// Find the desired mid point (x) of the captured image
int desiredMidpoint = camera_output->trajectory->width / 2;

// Compute the actual mid point between the left and right edges of the detected track
if (camera_output->trajectory->n_points < 2) {
    printf("Not enough track edge points to compute the mid point\n");
} else {
    // Compute the mid point (assuming [0] is the left edge and [1] is the right edge)
    int actualMidpoint = (camera_output->trajectory->points[1]->x - camera_output->trajectory->points[0]->x) / 2 + camera_output->trajectory->points[0]->x;
    // Compute the error
    int midpointErr = actualMidpoint - desiredMidpoint;
    // Compute the steering position
    steerPosition = (float)midpointErr / (float)desiredMidpoint;
}
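For example, with a 640-pixel-wide frame (desiredMidpoint = 320) and edge points at x = 100 and x = 300, actualMidpoint is 200, midpointErr is -120, and steerPosition becomes -120 / 320 = -0.375, i.e. steer moderately to the left.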
Easy enough, right? We will send the steerPosition to the servo in the next chapter. Before we do that, we need to wrap the message receiving in an infinite loop, so that we keep reading from the imaging service as soon as it has data available. Doing so, your project code should look like this:
#include <roverlib.h>
#include <sys/time.h>
#include <unistd.h>

long long current_time_millis() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return (long long)(tv.tv_sec) * 1000 + (tv.tv_usec) / 1000;
}

// The main user space program
// this program has all you need from roverlib: service identity, reading, writing and configuration
int user_program(Service service, Service_configuration *configuration) {
    // "I am the dummy controller, and I want to read from the 'path' stream from the 'imaging' service"
    read_stream *imaging = get_read_stream(&service, "imaging", "path");
    if (imaging == NULL) {
        printf("Could not get read stream for 'path' from 'imaging'\n");
        return 1;
    }

    while (true) {
        // Read one message from the stream
        ProtobufMsgs__SensorOutput *msg = read_pb(imaging);
        if (msg == NULL) {
            printf("Could not read 'path' from 'imaging'\n");
            continue; // do not dereference a NULL message
        }

        // When did the imaging service create this message?
        long long createdAt = msg->timestamp;
        printf("Received message with timestamp: %lld\n", createdAt);

        // Get the camera data
        if (msg->sensor_output_case != PROTOBUF_MSGS__SENSOR_OUTPUT__SENSOR_OUTPUT_CAMERA_OUTPUT) {
            printf("Received sensor data, but not camera data\n");
        } else {
            ProtobufMsgs__CameraSensorOutput *camera_output = msg->cameraoutput;
            printf("Imaging service captured a %d by %d image\n", camera_output->trajectory->width, camera_output->trajectory->height);

            // This value holds the steering position that we want to pass to the servo (-1 = left, 0 = center, 1 = right)
            float steerPosition = 0;

            // Find the desired mid point (x) of the captured image
            int desiredMidpoint = camera_output->trajectory->width / 2;

            // Compute the actual mid point between the left and right edges of the detected track
            if (camera_output->trajectory->n_points < 2) {
                printf("Not enough track edge points to compute the mid point\n");
            } else {
                // Compute the mid point (assuming [0] is the left edge and [1] is the right edge)
                int actualMidpoint = (camera_output->trajectory->points[1]->x - camera_output->trajectory->points[0]->x) / 2 + camera_output->trajectory->points[0]->x;
                // Compute the error
                int midpointErr = actualMidpoint - desiredMidpoint;
                // Compute the steering position
                steerPosition = (float)midpointErr / (float)desiredMidpoint;
            }
        }
    }
}

// This is just a wrapper to run the user program
// it is not recommended to put any other logic here
int main() {
    return run(user_program);
}
In the infinite loop above, read_pb(imaging) is a blocking call. It will only return once it has received a message from the imaging service. This means that if the imaging service sends data at 30 FPS, your main loop will run at ~30 iterations per second. If you need to perform time-critical tasks in the meantime, consider using a separate thread to run code in parallel.
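If you do need such parallelism, a minimal sketch using POSIX threads could look like the following. The time_critical_worker function and its rate are hypothetical placeholders for whatever your service needs to do alongside the read loop:
#include <pthread.h>
#include <stdbool.h>
#include <unistd.h>

// Hypothetical worker: runs time-critical tasks independently of the
// blocking read_pb() loop
void *time_critical_worker(void *arg) {
    while (true) {
        // ... perform periodic work here ...
        usleep(10 * 1000); // e.g. run at ~100 Hz
    }
    return NULL;
}

// Before entering the read loop in user_program, start the worker:
// pthread_t worker;
// pthread_create(&worker, NULL, time_critical_worker, NULL);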
Almost there now, but there is one more thing we need to do: tell roverd to resolve the read stream of the imaging service for us. We do this by modifying our service.yaml configuration.
Declaring a Read Stream
By defining the path stream as an input from the imaging service, roverd can analyze our pipeline and later resolve the service dependencies for us. All we need to do is to modify the inputs field:
...
inputs:
- service: imaging
  streams:
  - path
...
That's all! Go ahead and save your changes. If your Rover is powered on, try to sync your service files. roverctl will show you an error if your files or service.yaml definition is invalid.
roverctl service sync